<?xml version="1.0" encoding="ISO-8859-1"?>
<metadatalist>
	<metadata ReferenceType="Misc">
		<site>sibgrapi.sid.inpe.br 802</site>
		<identifier>8JMKD3MGPEW34M/45CGCRL</identifier>
		<repository>sid.inpe.br/sibgrapi/2021/09.04.02.00</repository>
		<lastupdate>2021:09.04.02.00.40 sid.inpe.br/banon/2001/03.30.15.38 administrator</lastupdate>
		<metadatarepository>sid.inpe.br/sibgrapi/2021/09.04.02.00.40</metadatarepository>
		<metadatalastupdate>2022:05.15.22.30.28 sid.inpe.br/banon/2001/03.30.15.38 administrator {D 2021}</metadatalastupdate>
		<citationkey>SousaFernVasc:2021:NoSeNe</citationkey>
		<title>ConformalLayers: A non-linear sequential neural network with associative layers</title>
		<shorttitle>Supplementary material</shorttitle>
		<format>On-line</format>
		<year>2021</year>
		<date>18-22 Oct. 2021</date>
		<numberoffiles>1</numberoffiles>
		<size>490 KiB</size>
		<author>Sousa, Eduardo Vera,</author>
		<author>Fernandes, Leandro A. F.,</author>
		<author>Vasconcelos, Cristina Nader,</author>
		<affiliation>Universidade Federal Fluminense</affiliation>
		<affiliation>Universidade Federal Fluminense</affiliation>
		<affiliation>Universidade Federal Fluminense</affiliation>
		<e-mailaddress>eduardovera@ic.uff.br</e-mailaddress>
		<transferableflag>1</transferableflag>
		<keywords>convolutional neural network, non-linear activation, associativity.</keywords>
		<abstract>Convolutional Neural Networks (CNNs) have been widely applied. However, as CNNs grow, the number of arithmetic operations and the memory footprint also increase. Furthermore, typical non-linear activation functions do not allow associativity of the operations encoded by consecutive layers, preventing the simplification of intermediate steps by combining them. We present a new activation function that allows associativity between sequential layers of CNNs. Even though our activation function is non-linear, it can be represented by a sequence of linear operations in the conformal model for Euclidean geometry. In this domain, operations such as, but not limited to, convolution, average pooling, and dropout remain linear. We take advantage of associativity to combine all the "conformal layers" and make the cost of inference constant regardless of the depth of the network.</abstract>
		<language>en</language>
		<targetfile>SupplementaryMaterial.pdf</targetfile>
		<usergroup>eduardovera@ic.uff.br</usergroup>
		<visibility>shown</visibility>
		<documentstage>not transferred</documentstage>
		<mirrorrepository>sid.inpe.br/banon/2001/03.30.15.38.24</mirrorrepository>
		<nexthigherunit>8JMKD3MGPEW34M/45CGCM8</nexthigherunit>
		<hostcollection>sid.inpe.br/banon/2001/03.30.15.38</hostcollection>
		<username>eduardovera@ic.uff.br</username>
		<agreement>agreement.html .htaccess .htaccess2</agreement>
		<lasthostcollection>sid.inpe.br/banon/2001/03.30.15.38</lasthostcollection>
		<url>http://sibgrapi.sid.inpe.br/rep-/sid.inpe.br/sibgrapi/2021/09.04.02.00</url>
	</metadata>
</metadatalist>